Language models conditioned on dialog state

Authors

  • Karthik Visweswariah
  • Harry Printz
Abstract

We consider various techniques for using the state of the dialog in language modeling. The language models we built were for use in an automated airline travel reservation system. The techniques we explored include (1) linear interpolation with state-specific models and (2) incorporating state information using maximum entropy techniques. We also consider using the system prompt as part of the language model history. We show that using state information results in about a 20% relative gain in perplexity and about a 9% relative gain in word error rate over a system using a language model with no knowledge of the state.
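
As a rough illustration of technique (1) above, the sketch below interpolates a state-specific bigram model with a general background model. This is a minimal sketch, not the authors' implementation: the class names, the add-one smoothing, and the fixed interpolation weight lam are assumptions made for illustration; in practice such weights would be tuned per dialog state on held-out data.

# Minimal sketch (not the paper's implementation) of technique (1):
# linearly interpolating a state-specific bigram model with a general model.
# Class names, add-one smoothing, and the fixed weight `lam` are illustrative
# assumptions.

from collections import defaultdict


class BigramModel:
    """Toy bigram model with add-one smoothing."""

    def __init__(self, vocab):
        self.vocab = set(vocab)
        self.bigram_counts = defaultdict(lambda: defaultdict(int))
        self.context_counts = defaultdict(int)

    def train(self, sentences):
        # Count bigrams, padding each sentence with <s> and </s>.
        for words in sentences:
            for prev, word in zip(["<s>"] + words, words + ["</s>"]):
                self.bigram_counts[prev][word] += 1
                self.context_counts[prev] += 1

    def prob(self, word, prev):
        # Add-one smoothing so unseen bigrams still get probability mass.
        v = len(self.vocab) + 2  # vocabulary plus <s> and </s>
        return (self.bigram_counts[prev][word] + 1) / (self.context_counts[prev] + v)


def interpolated_prob(word, prev, state, state_models, general_model, lam=0.5):
    """P(w | h, s) = lam * P_state(w | h) + (1 - lam) * P_general(w | h).

    Falls back to the general model when no model exists for this dialog state.
    """
    if state in state_models:
        return (lam * state_models[state].prob(word, prev)
                + (1 - lam) * general_model.prob(word, prev))
    return general_model.prob(word, prev)


if __name__ == "__main__":
    vocab = ["i", "want", "to", "fly", "boston", "yes", "no"]
    general = BigramModel(vocab)
    general.train([["i", "want", "to", "fly", "to", "boston"], ["yes"]])
    confirm = BigramModel(vocab)          # hypothetical "confirmation" state model
    confirm.train([["yes"], ["no"]])
    p = interpolated_prob("yes", "<s>", "CONFIRM", {"CONFIRM": confirm}, general)
    print(f"P(yes | <s>, CONFIRM) = {p:.3f}")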

Similar articles

Exploring Personalized Neural Conversational Models

Modeling dialog systems is currently one of the most active problems in Natural Language Processing. Recent advances in Deep Learning have sparked an interest in the use of neural networks in modeling language, particularly for personalized conversational agents that can retain contextual information during dialog exchanges. This work carefully explores and compares several of the recently prop...

Language model adaptation for spoken language systems

In a human-machine interaction (dialog), the statistical language variations are large among different stages of the dialog and across different speakers. Moreover, spoken dialog systems require extensive training data for training adaptive language models. In this paper we address the problem of open-vocabulary language models, allowing the user any possible response at each stage of the dia...

Spoken language variation over time and state in a natural spoken dialog system

We are interested in adaptive spoken dialog systems for automated services. People's spoken language usage varies over time for a fixed task, and furthermore varies depending on the state of the dialog. We will characterize and quantify this variation based on a database of 20K user transactions with AT&T's experimental ‘How May I Help You?’ spoken dialog system. We then report on a language ad...

Backoff Model Training using Partially Observed Data: Application to Dialog Act Tagging

Dialog act (DA) tags are useful for many applications in natural language processing and automatic speech recognition. In this work, we introduce hidden backoff models (HBMs) where a large generalized backoff model is trained, using an embedded expectation-maximization (EM) procedure, on data that is partially observed. We use HBMs as word models conditioned on both DAs and (hidden) DA segments...

Dynamic language modeling using Bayesian networks for spoken dialog systems

We introduce a new framework employing statistical language models (SLMs) for spoken dialog systems that facilitates the dynamic update of word probabilities based on dialog history. In combination with traditional state-dependent SLMs, we use a Bayesian Network to capture dependencies between user goal concepts and compute accurate distributions over words that express these concepts. This all...

Journal:

Volume:   Issue:

Pages:   -

Publication date: 2001